Patent abstract:
Abstract: System and Method for Biometric Authentication in Connection with Camera-Equipped Devices. The present invention relates generally to the use of biometric technology for authentication and identification, and more particularly to contactless authentication and user-identification solutions via computers, such as mobile devices, to selectively allow or prevent access to various resources. In the present invention, authentication and/or identification is performed using an image or set of images of a person's palm through a process involving the following main steps: (1) detecting the palm area using local classifiers; (2) extracting features from the region(s) of interest; and (3) computing a matching score against user models stored in a database, which can be dynamically augmented through a learning process.
Publication number: BR112015004867A2
Application number: R112015004867
Filing date: 2013-09-05
Publication date: 2020-04-22
Inventors: Perold Adam;Waghmare Sagar;Wang Yang;Lecun Yann
Applicant: Element Inc;
IPC main classification:
Patent description:

SYSTEM AND METHOD FOR BIOMETRIC AUTHENTICATION IN CONJUNCTION WITH CAMERA-EQUIPPED DEVICES
CROSS REFERENCE TO RELATED APPLICATIONS [001] This application is based on and claims the benefit of the filing date of provisional patent application No. US 61/696,820, filed on September 5, 2012, the entire disclosure of which is incorporated herein by reference.
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION [002] The present invention relates generally to the use of biometric technology for authentication and identification, and more particularly to contactless solutions for authenticating and identifying users, via computers such as mobile devices, in order to selectively allow or deny access to various resources. In the present invention, authentication and/or identification is performed using an image or a set of images of an individual's palm through a process that involves the following main steps: (1) detecting the palm area through the use of local classifiers; (2) extracting features from the region(s) of interest; and (3) computing a matching score against the user models stored in a database, which can be dynamically augmented through a learning process. DISCUSSION OF RELATED ART [003] Mobile devices, such as smart phones, tablets and notebooks, have become widely adopted and are used by many people on a daily basis.
These devices have become increasingly powerful, and as developers create more and more applications and services that operate on them, these devices become even more established in our daily lives. These mobile devices not only provide a powerful computing platform in their own right but also provide connectivity to a virtually unlimited set of services, applications and data available on remote platforms, which are typically accessed over a wireless link to a cell site and then carried back to the main internet infrastructure. In addition to accessing these remote platforms, mobile devices also have the ability to connect with other mobile devices over short- and long-range wireless connections.
[004] Perhaps most prominently, the continued increase in the penetration of these devices, together with the progressive reduction in the costs of their component parts, has resulted in devices with ever greater capabilities that remain affordable for many users. For example, as a result of reduced component-part costs and the development of more powerful software, a substantial number of smart phones today include powerful cameras that can take extraordinarily detailed photos at resolutions of eight megapixels or more.
[005] One important question that arises in the context of mobile devices, their comprehensive use in conjunction with so many resources, and the need for their interaction with so many different resources, is the need to control access to each of these resources so that only those individuals or devices that should be allowed to access the applicable resources do, in fact, have such capacity. In typical cases, resource access is controlled by entering numeric/text strings, such as user IDs and passwords. For example, a smart phone user may be asked to enter a four-digit code before being allowed to access any functionality on the device. In addition, each local application or other feature on the device may require that the user insert one or more numeric/text character strings before gaining access to the resource. In that case, the correct data (user ID, password, etc.) can be stored in the memory of the device.
Alternatively, for access to resources (applications, data, communication capabilities, etc.) that are located remotely from the device, the user and/or the device may be required to transmit the correct set of numeric/text strings to the remote resource, which, in turn, verifies that the transmitted data matches the correct data before allowing access to the resource.
[006] As can be imagined, for a typical smart phone user, for example, there are numerous disadvantages to the aforementioned authentication and identification techniques. First, the need to remember user IDs and passwords for so many different applications, services, and other resources, each of which has its own requirements for the way in which such IDs and passwords must be constructed, can be somewhat intimidating, and users often forget their IDs and passwords for resources that they do not access frequently. Another disadvantage is that there are security concerns with using numeric/text strings to control access to resources. For example, there are powerful software programs that can be used to crack these strings to gain unauthorized access to resources.
[007] In addition, the common contact-based method of a user using his or her fingers to enter passwords and user IDs on the screen of the smart phone lends itself to security risks. Experienced hackers often have the ability to steal the fingerprint pattern from the screen, based on the oil residue left by the finger, to gain unauthorized access. This is particularly evident in the context of inserting a short numeric character string, such as a four-digit number, to unlock the device. Once the device is unlocked, many of the features on the device may not even be secured, which creates serious security risks.
[008] One solution aimed at eliminating or reducing the disadvantages discussed above involves the use of biometric technology to control access to resources available through mobile devices. Although these solutions have, in some cases, eliminated some of the disadvantages discussed above, they still have numerous drawbacks. For example, some contact-based solutions ask a user to place a finger on the device's sensor, which captures the user's fingerprint; the fingerprint is then compared with fingerprint data located remotely or locally to determine whether there is a sufficient match to allow the user or device to access one or more resources. In that case, as mentioned above, a fingerprint can be stolen from the device's sensor by a hacker and used at a later time to gain unauthorized access to one or more resources. These solutions also typically have the disadvantage that the time required to perform the processing that determines whether the fingerprint matches can be unacceptable for a busy user trying to gain access to several different resources on the device over the course of an average day.
[009] There are additional health concerns associated with contact-based methods, involving the transmission of germs, viruses, or other biological hazards, particularly in the case of devices shared between users. As is known in the art, a person's fingertip, and a person's hands more generally, are often one of the primary means of transferring germs, viruses, or other biological hazards between people. In the case of individual devices that are shared among multiple people, contact-based authentication and identification methods, in which a user types an identification string or authenticates or identifies himself or herself using contact-based biometric methods, such as fingerprint or palm print recognition, among others, create the risk of transferring such biological hazards through shared contact.
SUMMARY OF THE INVENTION [010] The invention therefore aims to provide a contactless biometric system and methodology that supports accurate, secure and fast authentication and/or identification of users and devices, in order to provide selective access to resources accessible through camera-equipped devices.
[011] In an embodiment of the present invention, users of such camera-equipped devices (hereinafter referred to as smart phones for convenience, although the devices should be understood to include all devices with a camera, both mobile devices and stationary devices such as desktop computers) who are required to identify or authenticate themselves as a condition of gaining access to one or more resources do so by taking one or a series of photos of the palm of one hand, or of both palms, using the smart phone camera. Subsequently, the system of the present invention employs computer vision technology to analyze the palm print image and either verify that the palm print signature matches the user model in a database (user authentication) or find the matching user model from among several models in a database (user identification).
[012] Additional features and aspects of the present invention will become evident from the following detailed description of exemplary embodiments together with reference to the attached Figures.
BRIEF DESCRIPTION OF THE DRAWINGS [013] Figure 1 is a diagram representing the main components of the system of the present invention in a preferred embodiment thereof;
[014] Figure 2 is a block diagram, which is useful for illustrating the methodology of the present invention in a preferred embodiment;
[015] Figure 3 is a diagram illustrating secure connectivity between a mobile device and one or more remote servers in accordance with a preferred embodiment of the present invention;
[016] Figure 4 is a flowchart that illustrates the main steps for authenticating a user or device according to the present invention in a preferred embodiment thereof; and [017] Figure 5 is a flowchart that illustrates the main steps for identifying a user or device according to the present invention in a preferred embodiment.
DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS [018] Reference will now be made in detail to several exemplary embodiments of the invention. It should be understood that the discussion of exemplary embodiments below is not intended to limit the invention as broadly disclosed in this document. Instead, the following discussion is provided in order to give the reader a more detailed understanding of certain aspects and features of the invention.
[019] Before the embodiments of the present invention are described in detail, it should be understood that the terminology used in this document is intended exclusively to describe particular embodiments and is not intended to be limiting. Unless otherwise defined, all technical terms used in this document have the same meaning as commonly understood by a person of ordinary skill in the art to which the term belongs. Although any methods and materials similar or equivalent to those described herein can be used in the practice of the present invention, the preferred methods and materials will now be described. All publications mentioned in this document are incorporated herein by reference to disclose and describe the methods and/or materials in connection with which the publications are cited. The present disclosure will prevail to the extent that it conflicts with any incorporated publication.
[020] As used in this document and the appended claims, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise. Thus, for example, a reference to a palm includes a single palm or both palms of an individual, and a reference to an image includes a reference to one or more images. In addition, the use of terms that can be described using equivalent terms includes the use of such equivalent terms. Thus, for example, the term camera must be understood to include any device with the ability to obtain an image of an object. As another example, and as mentioned above, the term smart phone includes all devices with camera capability.
[021] A description of the present invention in preferred embodiments thereof now follows. Referring to Figure 1, a discussion follows of the key components of the system of the present invention, as well as the context in which each of these components interacts with the others to derive the advantages of the present invention. Device 100 can be any device that contains a camera with the ability to take high quality photographs. Preferably, the camera of device 100 also contains a flash element that can be selectively and quickly activated and deactivated to illuminate the area to be photographed. Examples of such devices 100 include smart phones, tablet computers, notebook computers, and any other device that can be carried by a user and that provides a computing platform that allows the functionality of the present invention to be operational, as well as desktop computers or a variety of stationary embedded devices. Examples of such stationary embedded devices include camera equipment attached to entrances to facilities or other strategic locations that provide secure access to physical spaces or other resources, or camera equipment attached to strategic locations for purposes such as time and attendance protocols, as well as other applications. Although not required, the device 100 may also contain several other features, such as a display screen (which can also be a touch screen), a keypad, an accelerometer, GPS capabilities, storage capacity, and a central processing unit (CPU).
[022] Device 100 includes at least one camera 105, which preferably has the capacity to produce high quality photographs, for example, of two megapixels or more, such as four megapixels, six megapixels, or eight megapixels. The camera data processor 110 receives image data from camera 105 and processes it, as known in the art, to create pixel data representative of the photograph, which can be used in a variety of ways, including the purposes outlined in connection with the present invention as now described. The data from camera data processor 110 is fed to the Region of Interest Detector 115, which serves to locate the palm area of the image and to delineate that area with a high level of accuracy and consistency, such as to provide palm masks with substantially the same shape and position on the palm across a variety of independent images with different lighting conditions and palm orientations relative to the camera.
[023] In one embodiment of the Region of Interest Detector 115, the region of interest is detected using local classifiers based on a sliding window to label, through classification scores, the pixels that are part of the palm and those that are not, followed by a segmentation step that groups neighboring palm pixels into connected components of the input image. A high level of accuracy and robustness to image noise can be achieved because a significant number of local discriminating features are learned from a large collection of example images to capture the stable characteristics of the palm's appearance and form strong classifiers. As a result, the trained detector can precisely locate and delineate the region(s) of interest in input images taken freely with varying hand orientations and lighting conditions.
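The sliding-window labeling and connected-component grouping described above can be sketched as follows. This is an illustration only: the trained local classifier is reduced to a stub that scores a window by its mean intensity, and all function names are hypothetical rather than taken from the patent.

```python
import numpy as np
from scipy import ndimage

def classify_window(window):
    # Stand-in for a trained local classifier (e.g. Haar + AdaBoost);
    # here we simply score a window by its mean intensity.
    return float(window.mean())

def detect_palm_mask(image, win=8, stride=4, threshold=0.5):
    """Label palm pixels with a sliding-window classifier, then group
    neighboring positive pixels into connected components."""
    h, w = image.shape
    votes = np.zeros((h, w))
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            score = classify_window(image[y:y + win, x:x + win])
            if score > threshold:
                votes[y:y + win, x:x + win] += 1  # pixel-level voting
    mask = votes > 0
    # Segmentation step: group neighboring palm pixels into components
    labels, n = ndimage.label(mask)
    if n == 0:
        return mask  # no palm found
    # Keep only the largest connected component as the palm region
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)

# Toy image: a bright square ("palm") on a dark background
img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0
mask = detect_palm_mask(img)
```

In a real detector the stub would be replaced by a classifier trained on the large collection of example images the text describes, and the voting threshold tuned for robustness to noise.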
[024] In one embodiment of the Region of Interest Detector 115, local classifiers based on Haar features and AdaBoost (reference 1) are used to detect the region of interest in the palm area of a user's hand. In another embodiment of the Region of Interest Detector 115, local classifiers based on support vector machines (reference 2) are used to detect the region of interest in the palm area of a user's hand. In yet another embodiment of the Region of Interest Detector 115, a convolutional neural network is used to detect the region of interest in the palm area of a user's hand, such as those described in US Patents 5,067,164 and 5,058,179 and in references 3 and 4.
[025] The Region of Interest Detector 115 then feeds the image data, which includes the palm area mask, to the Conversion Processor 120, which serves to extract a signature 125 from portions of the image that represent the characteristic features of the palm area of the individual's hand that can be used to distinguish that individual from another user, where said portions are small sample windows within the palm area mask.
[026] In one embodiment, Signature 125 is a vector calculated as follows. First, a histogram of edge orientations is calculated in a number of well-chosen regions of the image. This can be done using one of the well-known computer vision methods for local extraction of image descriptors, such as the Scale Invariant Feature Transform (SIFT) (see, for example, reference 5), Histograms of Oriented Gradients (HOG) (see, for example, reference 6), and others known in the art. Second, each orientation histogram is compared with numerous prototypes that have been calculated from training data, for example, using the well-known K-means clustering algorithm. Finally, the signature vector is formed so that the k-th component of the vector corresponds to the k-th prototype mentioned above: the k-th component contains the number of regions for which the histogram was closer to the k-th prototype than to all other prototypes. This sequence of operations is known in the literature as a bag-of-features representation (see reference 7, for example). It should be evident from the present teachings that, in another embodiment of the present invention, multiple bags of features can be used to preserve the geometric relationships between local regions.
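The quantization step just described, counting how many local descriptors fall closest to each prototype, can be sketched as follows. This is a hedged illustration: descriptor extraction is assumed to have already produced orientation histograms, and the prototypes are assumed to have been learned beforehand with K-means.

```python
import numpy as np

def bag_of_features_signature(descriptors, prototypes):
    """Form the signature vector: component k counts how many local
    descriptors were closer to prototype k than to any other prototype."""
    signature = np.zeros(len(prototypes), dtype=int)
    for d in descriptors:
        # squared Euclidean distance to every prototype
        dists = ((prototypes - d) ** 2).sum(axis=1)
        signature[np.argmin(dists)] += 1
    return signature

# Toy example: 2-D "orientation histograms" and 3 learned prototypes
prototypes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 0.1], [0.1, 0.9], [0.0, 0.8]])
sig = bag_of_features_signature(descriptors, prototypes)
# sig == [1, 1, 2]: one region nearest each of prototypes 0 and 1,
# two regions nearest prototype 2
```

In practice the descriptors would be SIFT- or HOG-style orientation histograms with far more dimensions, but the counting scheme is the same.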
[027] Signature 125 is then fed to the Authentication and Identification Mechanism (AID Engine) 130, which serves to implement many of the main processes of the present invention, as described hereinafter. The AID Engine 130 communicates with the User Model Database 135 which, if present, stores a local copy of a user model. Thus, in the case of applications or services that reside locally on Device 100 and do not require external communication with, for example, remote servers or remote devices, a user signature resulting from palm print images taken with camera 105 can be compared with the known user model(s) stored in the User Model Database 135 for authentication or identification. User models are statistical models calculated from a collection of images of an individual's palm, with the signatures derived from such images defining the model. In one embodiment, the user model consists of a so-called Gaussian density model of the signatures calculated from the user's reference images. Given the signature S of the queried image, the user model is used to compute a compatibility score

R = -Σ_i (S_i - M_i)^2 / (V_i + u)

where M_i and V_i are the mean and variance of the i-th component of the signature vectors over all reference images of the given user, and u is a small constant. The signature is considered compatible with the user model if the compatibility score R is greater than a pre-selected threshold for that user model. The Identification and Authentication Mechanism 130, the Model Assembly Mechanism 155 and the User Model Database 135 form an AID Unit 160.
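The compatibility computation can be sketched in a few lines. This is written under the assumption of a Gaussian-style score built from the stated per-component means M_i, variances V_i, and small regularizing constant u; the exact functional form and the constant's value are illustrative, not specified here.

```python
import numpy as np

def compatibility_score(S, M, V, u=1e-3):
    """Gaussian-model compatibility score: per-component squared deviation
    of the query signature S from the user's mean M, normalized by the
    variance V regularized by a small constant u. Higher is better."""
    S, M, V = map(np.asarray, (S, M, V))
    return -np.sum((S - M) ** 2 / (V + u))

def is_match(S, M, V, threshold, u=1e-3):
    # The signature matches the user model if the score exceeds a
    # pre-selected, per-model threshold.
    return compatibility_score(S, M, V, u) > threshold

# Toy check: a signature equal to the model mean scores 0 (the maximum)
M = np.array([3.0, 1.0, 2.0]); V = np.array([0.5, 0.5, 0.5])
assert compatibility_score(M, M, V) == 0.0
assert is_match(M, M, V, threshold=-1.0)
```

The per-model threshold trades off false accepts against false rejects and would be chosen per user in a deployed system.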
[028] Signature 125 is also fed to the Model Assembly Mechanism 155, either to initialize the user model when the user first registers, or to selectively incorporate the signature information to refine the user model stored in the User Model Database 135 if the model is already present. In one embodiment of the present invention, the Model Assembly Mechanism 155 updates the aforementioned means M_i and variances V_i using the signatures extracted from the user's new images.
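One way to incrementally refine the stored means and variances as new signatures arrive is a running-moments update. Welford's online algorithm is used here purely as an illustrative choice; the text does not specify the update rule, and the class name is hypothetical.

```python
import numpy as np

class UserModel:
    """Per-user running mean M and variance V of signature vectors,
    updated incrementally as new enrollment images arrive."""
    def __init__(self, dim):
        self.n = 0
        self.M = np.zeros(dim)    # running mean M_i
        self._m2 = np.zeros(dim)  # running sum of squared deviations

    def update(self, signature):
        # Welford's online update for mean and variance
        self.n += 1
        delta = signature - self.M
        self.M += delta / self.n
        self._m2 += delta * (signature - self.M)

    @property
    def V(self):
        # Population variance V_i of the reference signatures seen so far
        return self._m2 / self.n if self.n else self._m2

model = UserModel(dim=3)
for sig in [np.array([1.0, 2.0, 3.0]), np.array([3.0, 2.0, 1.0])]:
    model.update(sig)
# model.M == [2, 2, 2]; model.V == [1, 0, 1]
```

An online update of this kind lets the model be refined at every successful authentication without re-reading all stored reference images.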
[029] Device 100 preferably also contains a Remote Resource Interface 145, which communicates with the AID Engine 130. The Remote Resource Interface 145 serves as the interface between the authentication and identification functionality deployed on device 100 and that same functionality as it occurs on external/remote resources, such as remote servers and remote devices. In this way, for example, the Remote Resource Interface 145 interacts with applications resident on remote servers to coordinate authentication or identification as required by remote applications. This may include managing and responding to requests from external resources to authenticate or identify a user operating Device 100, or to authenticate or identify Device 100 itself.
[030] The Remote Resource Interface 145 can communicate with the Network Interface 150 to transmit and receive data in conjunction with authentication and identification activities. Several wireless communication protocols can be used, including radio frequency as well as others including, but not limited to, Bluetooth and other near field communication technologies. In a preferred embodiment of the present invention, the data sent and received by Device 100 over the open wireless link is secured, as is known in the art, through, for example, encryption and/or other methodologies, which reduce or eliminate the possibility that user data and other data associated with the authentication and identification methodologies of the present invention could be intercepted by unauthorized parties. The Network Interface 150 typically comprises a radio frequency transceiver module, as is known in the art, and allows Device 100 to communicate wirelessly with Wireless Network 400. Wireless Network 400, in turn, typically carries data transmitted by or to be received by Device 100 to Data Network 500, again as is known in the art.
[031] By way of example only, the present invention allows users of Device 100, or Device 100 itself, to be authenticated or identified by remote servers and by applications and other resources residing on remote servers. As illustrated in Figure 1, Remote Server 200 can communicate with Device 100 via the communication path discussed above. In this way, and as controlled by the Remote Resource Interface 145 that resides on Device 100, the AID Unit 205 residing on Remote Server 200 can request and receive authentication and identification data from Device 100 to compare with known and validated user models residing on Remote Server 200 or accessible by it, as described more fully below. This authentication and identification capability provides selective access to one or more applications 210, data 215 and other resources residing on Remote Server 200. The same capability can also provide selective access to Local Resources 140, which include applications, data and/or other resources that reside on Device 100, as well as in cases where such local resources seek access to data or other resources that are remote to Device 100.
[032] In another embodiment of the present invention, communication, as discussed above, can occur between Device 100 and one or more Remote Devices 300. Remote Devices 300 can be the same or a different type of device than Device 100, and the authentication/identification functionality in accordance with the teachings of the present invention can operate in both directions. In other words, Device 100 can respond to authentication/identification requests from Remote Device 300 in order to access, for example, one or more Applications 310 and/or Data 315 residing on Remote Device 300 through AID Unit 305 on Remote Device 300. Likewise, Remote Device 300 can receive and respond to authentication and identification requests initiated by Device 100 so that Remote Device 300 (or a user operating it) accesses the resources residing on Device 100. In some cases, both Device 100 and Remote Device 300 will request authentication and/or identification of the other before resources are shared. This can occur, for example, in the context of secure communication desired between users of Device 100 and Remote Device 300.
[033] Now, with regard to Figure 2, the user/device authentication and/or identification methodology according to a preferred embodiment of the present invention will be described. Through the initial discussion, the difference between authentication and identification in the context of the teachings of the present invention will be described first.
[034] In the case of authentication, the user presents an identity in the form of a user ID or user name, and the system of the present invention verifies whether the user is, in fact, who he or she claims to be. The system then compares the signature derived from an image or images of the user's palm with the corresponding model in the user model database. If they match, the user is authenticated. If they do not match, the user is rejected.
[035] The flowchart for user authentication according to the teachings of the present invention, in a preferred embodiment, is shown in Figure 4. As a first step, the user of Device 100 can enter his or her name or other identifying information on Device 100, or the user's identity may already have been previously loaded on Device 100. Separately, the user takes a photo or set of photos of the user's palm or hands using Camera 105 of Device 100. The Camera Data Processor 110 then sends the raw pixel data to the Region of Interest Detector 115, which determines the palm area in the image. The masked palm area from the Region of Interest Detector 115 is fed to the Conversion Processor 120, which derives the user's unique signature. This conversion function can alternatively be processed on a remote resource, or partially on a remote resource and partially on Device 100. Without any direct contact between the imaged palm area and Device 100, using high resolution images of the hand taken freely and in any orientation by the end user, without any special hardware other than a common digital camera, the system of the present invention identifies the individual through a multi-step software solution that involves feature extraction, the processing of features into user signatures, and the matching of user signatures against stored user signatures or user models, in which: (i) a single region or multiple regions of interest are detected and segmented from the input image to remove extraneous pixel data and isolate the palm area; (ii) numerous sparse high dimensional feature vectors are extracted from the image (see, for example, reference 8); (iii) a single signature is created for each image in a process in which neighboring feature vectors are grouped into a more compact and robust image representation; and (iv) multiple image signatures are combined into one identity model for each individual user.
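Steps (i) through (iv) above can be summarized as a pipeline. The sketch below is illustrative only: each stage is reduced to a crude stub so that the data flow is visible, and none of the function names or stand-in computations come from the patent.

```python
import numpy as np

def detect_region_of_interest(image):
    # (i) isolate the palm area; here: a fixed central crop as a stand-in
    h, w = image.shape
    return image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

def extract_feature_vectors(roi):
    # (ii) local descriptors; here: raw 4x4 patches as a stand-in for
    # sparse high-dimensional feature vectors
    return [roi[y:y + 4, x:x + 4].ravel()
            for y in range(0, roi.shape[0] - 3, 4)
            for x in range(0, roi.shape[1] - 3, 4)]

def to_signature(features, n_bins=8):
    # (iii) group local feature vectors into one compact signature;
    # here: a normalized histogram of descriptor means as a placeholder
    means = [f.mean() for f in features]
    hist, _ = np.histogram(means, bins=n_bins, range=(0.0, 1.0))
    return hist / max(hist.sum(), 1)

def build_identity_model(signatures):
    # (iv) combine several image signatures into one user identity model
    sigs = np.stack(signatures)
    return {"mean": sigs.mean(axis=0), "var": sigs.var(axis=0)}

# Enroll a user from two (toy) palm images
images = [np.full((32, 32), 0.6), np.full((32, 32), 0.7)]
model = build_identity_model(
    [to_signature(extract_feature_vectors(detect_region_of_interest(im)))
     for im in images])
```

Each stub would be replaced by the corresponding component described in the text: the trained Region of Interest Detector, the descriptor extraction and bag-of-features grouping of the Conversion Processor, and the statistical model built by the Model Assembly Mechanism.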
[036] The Identification and Authentication Mechanism 130 then looks up the user model (based on the user identifier data previously presented) in the User Model Database 135. At that point, if the derived user signature matches the stored user model, the user is authenticated and access to the desired resource or set of resources is allowed. Alternatively, if the user signature and model do not match, access to the desired resource or set of resources is denied to the user. The aforementioned lookup and comparison functionality can alternatively be performed remotely from Device 100.
[037] In the case of identification, the user only presents a palm print image or set of images, and the Identification and Authentication Mechanism 130 compares the signature derived from the palm print image or images with all the models, or a subset of the models, in the User Model Database 135. If a match is found, the user is identified. If no match is found, the user is characterized as unknown.
[038] The flowchart for user identification is shown in Figure 5. In this case, as in the case of authentication, the user takes a picture or set of pictures of the user's palm. This data is again processed into pixel form by the Camera Data Processor 110 and sent to the Region of Interest Detector 115 to determine the palm area in the image. The masked palm area from the Region of Interest Detector 115 is fed to the Conversion Processor 120, which derives the user's unique signature, and the AID Engine 130 then compares the derived signature with all the models, or a subset of the models, stored in the User Model Database 135. The conversion and comparison functions mentioned above could alternatively be processed on a remote resource, or partially on a remote resource and partially on Device 100. In any case, if a match is found, the user is identified and access to a resource or set of resources can be allowed. If no match is found, the user cannot be identified and access to the desired resource or set of resources will not be allowed.
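The identification step, scoring the query signature against every stored model and accepting the best candidate only if it clears that model's own threshold, might be sketched as follows. The scoring function assumes the Gaussian-style score implied by the stated per-component means and variances; the database layout and names are hypothetical.

```python
import numpy as np

def score(S, model, u=1e-3):
    # Compatibility of signature S with one user model (higher is better)
    return -np.sum((S - model["mean"]) ** 2 / (model["var"] + u))

def identify(S, database):
    """Compare S against all stored models; return the best-matching
    user ID, or None if no model clears its own threshold."""
    best_user, best_score = None, -np.inf
    for user_id, model in database.items():
        r = score(S, model)
        if r > model["threshold"] and r > best_score:
            best_user, best_score = user_id, r
    return best_user  # None means "unknown user"

db = {
    "alice": {"mean": np.array([1.0, 0.0]), "var": np.array([0.1, 0.1]),
              "threshold": -5.0},
    "bob":   {"mean": np.array([0.0, 1.0]), "var": np.array([0.1, 0.1]),
              "threshold": -5.0},
}
assert identify(np.array([0.9, 0.1]), db) == "alice"
assert identify(np.array([5.0, 5.0]), db) is None
```

Restricting the loop to a subset of models, as the text allows, is a straightforward change to the iteration over the database.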
[039] Which mode (authentication or identification) will be used depends on the application. Typically, authentication provides a higher level of accuracy but a lower level of user experience, due to the extra step the user must go through to enter an additional factor of the user's identity. The second identity factor can take any of the common forms, such as a user name, user ID, password, unique employee ID, social identification number, email address, a variety of other biometric modalities, among others. In one embodiment of the present invention, the second identity factor is the signature derived from the palm print image(s) of the individual's second hand, with the individual signatures from each of the two palm print images or image sets used together for authentication or identification.
[040] It is important to note that in each case described above (authentication or identification), instead of comparing a user signature with a model in the User Model Database 135 located locally on Device 100, a signature generated from an image or set of images of a user's palm taken on Device 100 could be compared with a model or models contained in a database located on either or both of the Remote Server 200 and one or more Remote Devices 300. In that case, the user of Device 100 would typically be seeking access to one or more resources residing on those remote platforms rather than a resource located locally on Device 100. For example, in the case of unlocking a smart phone, processing could be done locally on the smart phone/Device 100, whereas if authentication is done, for example, in conjunction with a remote-based application, some parts of the processing could be done on a Remote Server 200, with user models compared against those stored on the Remote Server 200 rather than locally on the smart phone. Additionally, it should be evident from the present teachings that user models, signatures and/or other biometric data can be synchronized between any of the AID Units 160, 205, 305 to allow local authentication or identification on any of Device 100, Remote Server 200 or Remote Device 300, without said Device 100, Remote Server 200 or Remote Device 300 having generated said user model, signature and/or other biometric data locally.
[041] Returning to Figure 2, it can be seen that in a preferred embodiment of the present invention, in step (1), the Device 100 is used to take a photograph or series of photographs of the palm of the user to be identified (step (2)) with the Camera 105 (step (3)). A flash component (step (4)) can be embedded in the Device 100 to provide the necessary image pre-processing, particularly as it relates to providing sufficient minimal illumination for the detection of the region of interest, attribute extraction, and signature processing of the image of an individual's palm. The palm region of the image is then masked by the Region of Interest Detector 115 (step (5)) and fed to the Conversion Processor 120 (step (6)), which converts the raw pixels into a unique identifying user signature, the Signature 125. The user signature is a compact code that contains the relevant identifying information associated with the image of the user's palm print and can be quickly and accurately compared against a large database of user templates, such as the User Template Database 135 or a database on a remote platform (step (7)). A benefit of the derived user signature is that it makes it essentially impossible to reconstruct an image of the user's palm from a user model database. In step (8), the AID Engine 130 compares the user signature from the image or set of images of the palm with those in the user model database to identify or authenticate the user, as applicable. The conversion and comparison functions mentioned above could alternatively be processed on a remote resource, or partially on a remote resource and partially on the Device 100.
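The capture-to-signature-to-match flow of steps (1) through (8) can be illustrated with a minimal sketch. The function names, the central-crop stand-in for the local-classifier detector, the fixed 256-element signature, and the cosine-similarity score are all illustrative assumptions, not the patented implementation:

```python
import numpy as np

def detect_palm_region(image):
    # Hypothetical stand-in for the local-classifier detector:
    # here we simply crop the central region of the frame.
    h, w = image.shape[:2]
    return image[h // 4: 3 * h // 4, w // 4: 3 * w // 4]

def compute_signature(region):
    # Stand-in for the conversion processor: reduce raw pixels
    # to a fixed-length, L2-normalized vector (the "signature").
    vec = region.astype(np.float64).ravel()
    vec = vec[:256] if vec.size >= 256 else np.pad(vec, (0, 256 - vec.size))
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def match_score(signature, template):
    # Cosine similarity against a stored user template.
    return float(np.dot(signature, template))

def authenticate(image, template_db, threshold=0.9):
    # Steps (5)-(8): detect, convert, and compare against all templates.
    sig = compute_signature(detect_palm_region(image))
    scores = {user: match_score(sig, tpl) for user, tpl in template_db.items()}
    best_user = max(scores, key=scores.get)
    if scores[best_user] >= threshold:
        return best_user, scores[best_user]
    return None, 0.0
```

As in the text, the same comparison could run against a local template database or one held on a remote server; only the location of `template_db` changes.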
[042] Turning now to Figure 3, it can be observed that, in cases where authentication or identification is performed against a remote resource, the communication between the Device 100 and such remote resource preferably occurs over a secure connection. This may involve one or more techniques known in the art, including, for example, strong encryption, public or private key encryption, digital certificates and/or digital signatures, among others.
[043] Now that the primary system and methodologies of the present invention have been described, additional novel features, such as several methodologies for preventing identity spoofing in conjunction with authentication / identification, as well as a new methodology for encoding and exchanging information in transactions with remote resources, will be discussed.

[044] Protection against identity spoofing is an important aspect of this invention. It prevents adversaries from, for example, using a printed photograph of a palm instead of a real hand for authentication. A novel aspect of the present invention that addresses protection against identity spoofing involves the detection and use of the three-dimensional characteristics of a human hand in order to provide security against identity spoofing.
[045] In one example of spoof detection, in order to distinguish between a photograph and a real hand, the system of the present invention takes a series of photographs in rapid sequence, in which the camera flash is used intermittently and at varying time intervals. Photographs of a three-dimensional object (a real hand) taken with the flash will have certain highlighted regions and shadows created by the flash, whereas a two-dimensional representation of the hand (for example, a printed photograph of a palm, or a palm image displayed on the screen of another mobile device) would not display such highlighted regions and shadows. This allows the system of the present invention to use a comparison of the highlighted regions and shadows created on the hand between photographs with and without flash to distinguish between a printed photograph and a real hand. As such, an unauthorized party that may have obtained a palm photograph of the authorized user cannot use that photograph to gain unauthorized access to local or remote resources.
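The flash / no-flash comparison can be illustrated with a simplified heuristic. The brightness threshold and the fraction-of-highlight test below are assumptions for illustration only, not the specified detector:

```python
import numpy as np

def specular_difference(frame_no_flash, frame_flash, highlight_thresh=200):
    """Fraction of pixels that become specular highlights when the flash
    fires; a 3D hand produces localized highlights and shadows, while a
    flat printout changes far more uniformly."""
    gained = (frame_flash > highlight_thresh) & (frame_no_flash <= highlight_thresh)
    return float(np.mean(gained))

def looks_three_dimensional(frame_no_flash, frame_flash,
                            min_frac=0.01, max_frac=0.30):
    # Heuristic (assumed bounds): a real hand gains some, but not all,
    # highlight pixels between the no-flash and flash frames.
    frac = specular_difference(frame_no_flash, frame_flash)
    return min_frac <= frac <= max_frac
```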
[046] Additional methods for detecting a real hand include three-dimensional modeling of the hand. In that case, the system of the present invention can instruct the user to turn the hand over while a series of multiple photographs is taken. A true three-dimensional object will reveal different parts of the hand with each successive image, while a two-dimensional object will always show exactly the same part of the hand, with only varying degrees of distortion. This allows the system of the present invention to distinguish between a printed photograph and a real hand. Similarly, instead of rotating the hand, the user may be instructed to close the hand into a fist, or open it from a fist, while the series of photographs is taken. Other methods of distinguishing a real hand from the photograph of a hand are also possible.
[047] Another novel aspect of the present invention is a methodology by which replay attempts can be detected and prevented. In this scenario, an adversary modifies a mobile device so that it sends one or a series of previously recorded photographs of the real hand of a legitimate user to the network for the purpose of authentication or identification, rather than sending the images actually captured by the camera. It is assumed here that the adversary could take photographs of an authorized user's hand without the authorized user's knowledge or ability to prevent it. If this is a real risk (for example, a case where an authorized user is asleep or unconscious), then it is preferable that the system be used in such a way that one or more additional identity factors, such as a user ID or another form of data independent of the palm print image, are required to authenticate a user.
[048] To detect and defend against a replay attempt, the system of the present invention captures a series of photographs with flashes at a variety of intervals, that is, it records a series of photographs, some with the flash off and others with the flash on. The specific photographs and the on/off flash sequence can be chosen at random or according to a predetermined sequence, and can change for each authentication or identification request. The system of the present invention can easily detect whether an adversary is using a previously recorded series of photographs, because the pattern of photographs and on/off flashes will not match the one actually sent to the mobile device.
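A minimal sketch of such a challenge-response flash pattern follows, assuming flash frames can be distinguished by mean brightness. The brightness threshold and pattern length are illustrative assumptions:

```python
import secrets

def generate_flash_challenge(length=8):
    # Fresh, randomly chosen on/off flash pattern for this session.
    return [secrets.randbelow(2) == 1 for _ in range(length)]

def frame_has_flash(frame_brightness, threshold=150.0):
    # Assumed: flash frames are markedly brighter on average.
    return frame_brightness > threshold

def verify_flash_pattern(challenge, frame_brightnesses):
    # A replayed, pre-recorded sequence will not match the fresh challenge.
    if len(challenge) != len(frame_brightnesses):
        return False
    return all(frame_has_flash(b) == expected
               for expected, b in zip(challenge, frame_brightnesses))
```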
[049] Another method for detecting a replay attempt involves storing all previously used images and comparing new images with those in the database. Because the pixel data underlying two different images of a palm can essentially never be exactly the same, or substantially the same within a given tolerance level, the system can detect when a previously captured image is used again. Other methods for detecting a replay attempt are also conceivable.
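The store-and-compare defense can be sketched with exact digest matching. The specification also contemplates near-duplicate matching within a tolerance, which this simplified example omits:

```python
import hashlib

class ReplayDetector:
    """Stores a digest of every image already used for authentication;
    an exact resubmission of a previous image is flagged as a replay."""

    def __init__(self):
        self._seen = set()

    def is_replay(self, image_bytes):
        # Digest rather than raw bytes keeps the stored history compact.
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in self._seen:
            return True
        self._seen.add(digest)
        return False
```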
[050] Yet another novel aspect of the present invention is the ability to embed transaction information or other data, over time, in a series of photographs and/or flash patterns. This timing pattern can additionally be used to encode information about the transaction itself. A cryptographic hash can then be applied to that information. The hash makes the resulting code compact (short) and also prevents anyone observing the flash pattern from deriving any information about the original content of the code. In one embodiment of the present invention, the image sequence timing and/or flash patterns are used as part of an anti-spoofing mechanism to determine whether the image sequence provided for authentication or identification is consistent with the transaction information itself. A specific implementation can include:
1. A low resolution video of the palm area with the flash pattern.
2. One or more high resolution still images of the palm area.
3. Computer vision technology to ensure that the high resolution image(s) is (are) of the same object as the one in the video.
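The hash-to-flash-pattern encoding might be sketched as follows. The use of SHA-256 and the 16-bit pattern length are illustrative assumptions, not the specified scheme:

```python
import hashlib

def transaction_to_flash_pattern(transaction_info, pattern_bits=16):
    """Derive a short on/off flash pattern from a cryptographic hash of
    the transaction data. The hash keeps the code compact and prevents an
    observer of the flashes from recovering the transaction content."""
    digest = hashlib.sha256(transaction_info.encode("utf-8")).digest()
    bits = []
    for byte in digest:
        for i in range(8):
            bits.append((byte >> i) & 1 == 1)
            if len(bits) == pattern_bits:
                return bits
    return bits

def pattern_matches_transaction(observed_pattern, transaction_info):
    # Anti-spoofing check: the captured flash pattern must match the
    # pattern derived from the transaction being authorized.
    return observed_pattern == transaction_to_flash_pattern(
        transaction_info, len(observed_pattern))
```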
[051] Based on the foregoing description of the system and methodologies of the present invention, it can be understood that several applications are possible. Examples include, but are not limited to, access to one or more devices, access to one or more applications residing on those devices or located remotely on a server or on other remote devices, a variety of transactional applications (such as electoral voting, social welfare distribution, and financial payments), and any other type of transaction that requires validation of the user's identity.
[052] In summary, in exemplary embodiments, the present invention provides computer systems (which include a combination of software operating on suitable hardware), computer-implemented methods, and devices for the authentication or identification of an individual that include the use of an image or a set of images of an individual's palm, through a process that involves the following steps: (1) detecting the palm area using local classifiers; (2) extracting attributes from the region(s) of interest; and (3) computing a match score against user models stored in a database, which can be dynamically augmented through a learning process. Accordingly, the invention includes a system for providing selective access to resources available in conjunction with a device, comprising software running on suitable computer hardware, wherein the system comprises: (a) at least one camera associated with said device, wherein said camera is capable of taking at least one photograph of a human palm print; (b) a detector module that uses local classifiers to locate and segment the palm region of interest without physical contact; (c) a conversion processor that converts the raw pixel data associated with said region of interest of a human palm print into a unique signature associated with said palm print; and (d) an identification and authentication engine, wherein said identification and authentication engine determines whether access to one or more of said resources should be granted, based on said unique signature and at least one database containing a variety of user models. The system may additionally comprise a learning processor that enhances user models with new data, wherein the learning processor selectively includes said palm print image to augment said database and said identification and authentication engine. In some embodiments, the device is a mobile device, while in other embodiments the device is a desktop device or a stationary embedded device. The system may include a flash component that is selectively activated at the time of image capture to provide sufficient minimal illumination for the detection of the region of interest, attribute extraction, and signature processing of the image of the human palm. In some embodiments, the system's conversion processor uses descriptors extracted from patches over the region of interest. The descriptors can be encoded into sparse high-dimensional vectors, which can be pooled into at least one group.
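The patch-descriptor / sparse-coding / pooling chain can be illustrated with a hard-assignment sketch. The dictionary, the k-nearest activation, and max-pooling are simplifying assumptions, not the specified encoder:

```python
import numpy as np

def encode_sparse(descriptor, dictionary, k=2):
    """Sketch of sparse coding by hard assignment: activate only the k
    dictionary atoms closest to the descriptor, yielding a sparse
    high-dimensional code for one patch."""
    dists = np.linalg.norm(dictionary - descriptor, axis=1)
    code = np.zeros(len(dictionary))
    code[np.argsort(dists)[:k]] = 1.0
    return code

def pool_codes(codes):
    # Max-pooling over all patch codes yields one fixed-length vector
    # per region, i.e. the "group" the sparse codes are aggregated into.
    return np.max(np.stack(codes), axis=0)
```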
[053] The system of the invention may have, as part of the method implemented in the system, the feature of computing a signature from an Attribute Set or from multiple Attribute Set representations. In addition, the system's detector module can use Haar Wavelets and AdaBoost algorithms. In various embodiments, the system includes a detector module that uses support vector machines or a convolutional neural network. The system's user model can be a statistical model computed from a collection of images of the human palm. Likewise, the user model can be a Gaussian density model or a mixture of Gaussian density models.
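A Gaussian-density user model of the kind mentioned can be sketched as follows. The diagonal covariance and the signature dimensionality are illustrative assumptions; the mixture-of-Gaussians variant generalizes this single-component case:

```python
import numpy as np

def fit_gaussian_model(signatures):
    """A user model as a single diagonal-covariance Gaussian fitted to
    the signatures of that user's enrollment images."""
    data = np.stack(signatures)
    mean = data.mean(axis=0)
    var = data.var(axis=0) + 1e-6  # variance floor for numerical stability
    return mean, var

def log_likelihood(signature, model):
    # Log-density of a new signature under the user's Gaussian model;
    # higher values indicate a better match.
    mean, var = model
    return float(-0.5 * np.sum((signature - mean) ** 2 / var
                               + np.log(2 * np.pi * var)))
```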
[054] The system of the invention can be configured so that at least one of the resources is remote from the device. Alternatively, at least one of the resources can reside on the device. In some embodiments, at least one of the resources is an application or a database.
[055] In embodiments of the system of the invention, the individual signatures from each of the two palm print images of a human, where available, are used together for the authentication or identification of the human.
[056] In some embodiments of the system of the invention, the authentication or identification of the palm print occurs in conjunction with other modalities, such as one or more of the following: passwords, security questions, fingerprint recognition, facial recognition, iris recognition, written signature recognition, and other biometric and non-biometric modalities.
[057] The system of the invention can be implemented in such a way that an application selectively allows one or more users to conduct one or more transactions.
[058] The system of the invention can also include the use of a sequence of images of the human palm with and without flash, which can be used, among other purposes, as part of an anti-spoofing mechanism to determine whether the presented hand is a three-dimensional object or a two-dimensional representation of a hand. In addition, the system of the invention can be deployed in such a way that image data captured during movement of the human palm is used as part of an anti-spoofing mechanism to determine whether the presented hand is a three-dimensional object or a two-dimensional representation of a hand. In some embodiments, the sequence of images of the human palm with and without flash, as well as the time interval(s) between successive images, are used as part of an anti-spoofing mechanism to determine whether an adversary is attempting to use a previously recorded sequence of images for authentication or identification.
[059] In some embodiments of the invention, all previously used images of a human are stored, such as in a database residing on a computing device, for comparison with new images as part of an anti-spoofing mechanism to determine whether an adversary is attempting to use previously recorded images for authentication or identification. Moreover, in certain embodiments, the system of the invention is implemented so that transaction information, or other data, is embedded within a sequence of images and/or flash patterns as part of an anti-spoofing mechanism to determine whether the sequence of images provided for authentication or identification is consistent with the information of the transaction itself.
[060] Although particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based on the teachings of this document, changes and modifications can be made without departing from this invention in its broader aspects.
REFERENCES CITED (1) Paul Viola and Michael Jones, Rapid Object Detection using a Boosted Cascade of Simple Features, Proceedings of IEEE Computer Vision and Pattern Recognition, 2001, vol. 1, pages 511 to 518.
(2) Corinna Cortes and Vladimir N. Vapnik, Support-Vector Networks, Machine Learning, 20, 1995.
(3) Yann LeCun, Leon Bottou, Yoshua Bengio, Patrick Haffner: Gradient-Based Learning Applied to Document Recognition, Proceedings of the IEEE, 86 (11): 2278 to 2324, November 1998.
(4) Pierre Sermanet, Koray Kavukcuoglu, Soumith Chintala and Yann LeCun: Pedestrian Detection with Unsupervised Multi-Stage Feature Learning, Proc. International Conference on Computer Vision and Pattern Recognition (CVPR'13), IEEE, June 2013.
(5) David G. Lowe, Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision, 60, 2 (2004), pages 91 to 110.
(6) N. Dalal and B. Triggs, Histograms of Oriented Gradients for Human Detection, in Proceedings of Computer Vision and Pattern Recognition, 2005.
(7) Y-Lan Boureau, Jean Ponce and Yann LeCun: A theoretical analysis of feature pooling in vision algorithms, Proc. International Conference on Machine learning (ICML'10), 2010.
(8) Yann LeCun, Koray Kavukvuoglu and Clement Farabet: Convolutional Networks and Applications in Vision, Proc. International Symposium on Circuits and Systems (ISCAS'10), IEEE, 2010.
CLAIMS
1. System to provide selective access to resources available in conjunction with a device that comprises software running on suitable computer hardware, CHARACTERIZED by the fact that said system comprises:
(a) at least one camera associated with said device, wherein said camera is capable of taking at least one photograph of a human palm print;
(b) a detector module that uses local classifiers to locate and segment the palm region of interest without physical contact;
(c) a conversion processor that converts the raw pixel data associated with said region of interest of a human palm print into a unique signature associated with said palm print; and
(d) an identification and authentication engine, wherein said identification and authentication engine determines whether access to one or more of said resources should be permitted based on said unique signature and at least one database containing a variety of user models.
2. System, according to claim 1, CHARACTERIZED by the fact that it additionally comprises a learning processor that enhances user models with new data, wherein the learning processor selectively includes said palm print image to augment said database and said identification and authentication engine.
3. System, according to claim 1, CHARACTERIZED by the fact that said device is a mobile device.
4. System, according to claim 1, CHARACTERIZED by the fact that said device is a desktop device.
5. System, according to claim 1, CHARACTERIZED by the fact that said device is a stationary embedded device.
6. System, according to claim 1, CHARACTERIZED by the fact that said device includes a flash component that is selectively activated at the moment of image capture to provide sufficient minimal illumination for the detection of the region of interest, attribute extraction, and signature processing of the image of the human palm.
7. System, according to claim 1, CHARACTERIZED by the fact that the conversion processor uses descriptors extracted from patches over the region of interest.
8. System, according to claim 1, CHARACTERIZED by the fact that the descriptors are encoded in sparse high-dimensional vectors.
9. System, according to claim 1, CHARACTERIZED by the fact that the sparse vectors are pooled into at least one group.
10. System, according to claim 1, CHARACTERIZED by the fact that the signature is computed from an Attribute Set or multiple Attribute Set representations.
11. System, according to claim 1, CHARACTERIZED by the fact that the detector module uses Haar Wavelets and AdaBoost algorithms.
12. System, according to claim 1, CHARACTERIZED by the fact that the detector module uses support vector machines.
13. System, according to claim 1, CHARACTERIZED by the fact that the detector module uses a convolutional neural network.
14. System, according to claim 1, CHARACTERIZED by the fact that the user model is a statistical model computed from a collection of images of a human's palm.
15. System, according to claim 1, CHARACTERIZED by the fact that the user model is a Gaussian density model.
16. System, according to claim 1, CHARACTERIZED by the fact that the user model is a mixture of Gaussian density models.
17. System, according to claim 1, CHARACTERIZED by the fact that at least one of the resources is remote from the device.
18. System, according to claim 1, CHARACTERIZED by the fact that at least one of the resources resides on the device.
19. System, according to claim 1, CHARACTERIZED by the fact that said at least one of the resources is an application.
20. System, according to claim 1, CHARACTERIZED by the fact that said at least one of the resources is a database.
21. System, according to claim 1, CHARACTERIZED by the fact that the individual signatures of each of the two images of the human palm print, if available, are used together for the authentication or identification of the human.
22. System, according to claim 1, CHARACTERIZED by the fact that authentication or identification of the palm print is combined with other modalities.
23. System, according to claim 21, CHARACTERIZED by the fact that the other modalities include one or more of the following: passwords, security questions, fingerprint recognition, facial recognition, iris recognition, written signature recognition, and other biometric and non-biometric modalities.
24. System, according to claim 1, CHARACTERIZED by the fact that an application selectively allows one or more users to conduct one or more transactions.
25. System, according to claim 1, CHARACTERIZED by the fact that a sequence of images of the human palm with and without flash is used as part of an anti-spoofing mechanism to determine whether the presented hand is a three-dimensional object or a two-dimensional representation of a hand.
26. System, according to claim 1, CHARACTERIZED by the fact that image data captured during movement of the human palm is used as part of an anti-spoofing mechanism to determine whether the presented hand is a three-dimensional object or a two-dimensional representation of a hand.
27. System, according to claim 1, CHARACTERIZED by the fact that the sequence of images of the human palm with and without flash, as well as the time interval(s) between successive images, are used as part of an anti-spoofing mechanism to determine whether an adversary is attempting to use a previously recorded sequence of images for authentication or identification.
28. System, according to claim 1, CHARACTERIZED by the fact that all previously used images of a human are stored for comparison with new images as part of an anti-spoofing mechanism to determine whether an adversary is attempting to use previously recorded images for authentication or identification.
29. System, according to claim 1, CHARACTERIZED by the fact that transaction information, or other data, is embedded within a sequence of images and/or flash patterns as part of an anti-spoofing mechanism to determine whether the sequence of images provided for authentication or identification is consistent with information from the transaction itself.
类似技术:
公开号 | 公开日 | 专利标题
AU2019203766B2|2021-05-13|System and method for biometric authentication in connection with camera-equipped devices
US10664581B2|2020-05-26|Biometric-based authentication method, apparatus and system
Jain et al.2016|50 years of biometric research: Accomplishments, challenges, and opportunities
Dinca et al.2017|The fall of one, the rise of many: a survey on multi-biometric fusion methods
US20050220326A1|2005-10-06|Mobile identification system and method
Akhtar et al.2016|Face spoof attack recognition using discriminative image patches
Barra et al.2013|Fame: face authentication for mobile encounter
Fenu et al.2018|Controlling user access to cloud-connected mobile applications by means of biometrics
CN107395369B|2021-03-02|Authentication method, access method and system for self-contained equipment of mobile Internet
US10867022B2|2020-12-15|Method and apparatus for providing authentication using voice and facial data
TWI727329B|2021-05-11|Anti-spoofing system and method for providing selective access to resources based on a deep learning method
Hamdan et al.2018|A self-immune to 3D masks attacks face recognition system
Choudhury2020|Biometrics security based on face recognition
Lin et al.2011|Biometric authentication
RAKESH et al.2015|Image Quality Assessment for Fake Biometric Detection: Application to Iris, Fingerprint, and Face Recognition
Ara et al.0|An Efficient Privacy-Preserving User Authentication Scheme Using Image Processing and Blockchain Technologies
Srividhya0|An Android Based Secure Access Control Using Raspberry and Cloud Computing Technique
同族专利:
公开号 | 公开日
ZA201502207B|2016-06-29|
WO2014039732A2|2014-03-13|
KR20150079583A|2015-07-08|
PH12015500481A1|2015-04-27|
IL237556A|2020-03-31|
US20140366113A1|2014-12-11|
EP2893489A2|2015-07-15|
US20140068740A1|2014-03-06|
US10135815B2|2018-11-20|
EP3657389A1|2020-05-27|
EP2893489B1|2020-02-19|
MX346218B|2017-03-09|
EP2893489A4|2016-04-20|
JP2020061171A|2020-04-16|
JP6634127B2|2020-01-22|
PH12021551836A1|2022-01-10|
IL272998A|2021-06-30|
US20190124079A1|2019-04-25|
US10728242B2|2020-07-28|
ES2791776T3|2020-11-05|
AP2015008348A0|2015-04-30|
JP2015529365A|2015-10-05|
MY181564A|2020-12-29|
AU2013312495A1|2015-04-23|
KR101938033B1|2019-01-11|
EA201590485A1|2015-12-30|
SG11201501691VA|2015-04-29|
AU2013312495B2|2019-03-21|
CN104756135B|2018-11-23|
CA2884096A1|2014-03-13|
MX2015002841A|2015-10-29|
HK1212494A1|2016-06-10|
AU2019203766A1|2019-06-20|
JP2018200716A|2018-12-20|
IL272998D0|2020-04-30|
CA2884096C|2021-01-26|
PH12015500481B1|2015-04-27|
WO2014039732A3|2014-05-08|
CN104756135A|2015-07-01|
AU2019203766B2|2021-05-13|
IL237556D0|2015-04-30|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

US3733861A|1972-01-19|1973-05-22|Recognition Devices|Electronic recognition door lock|
US4371951A|1980-09-29|1983-02-01|Control Data Corporation|Apparatus for converting serial input sparse vector format to parallel unpacked format for input to tandem arithmetic logic units|
US5067164A|1989-11-30|1991-11-19|At&T Bell Laboratories|Hierarchical constrained automatic learning neural network for character recognition|
US5058179A|1990-01-31|1991-10-15|At&T Bell Laboratories|Hierarchical constrained automatic learning network for character recognition|
US5450523A|1990-11-15|1995-09-12|Matsushita Electric Industrial Co., Ltd.|Training module for estimating mixture Gaussian densities for speech unit models in speech recognition systems|
US5774059A|1995-07-20|1998-06-30|Vindicator Corporation|Programmable electronic lock|
CA2156236C|1995-08-16|1999-07-20|Stephen J. Borza|Biometrically secured control system for preventing the unauthorized use of a vehicle|
JP2815045B2|1996-12-16|1998-10-27|日本電気株式会社|Image feature extraction device, image feature analysis device, and image matching system|
US6178255B1|1998-04-28|2001-01-23|Cross Match Technologies, Inc.|Individualized fingerprint scanner|
DE60008807T2|1999-03-24|2005-01-13|Tosoh Corp., Shinnanyo|Binaphthol monophosphoric acid derivatives and their use|
JP4727065B2|2000-05-11|2011-07-20|株式会社半導体エネルギー研究所|Authentication apparatus and communication system|
US6956608B1|2000-08-11|2005-10-18|Identix Incorporated|Fingerprint imaging device including an optical plate having microreflectors|
US6819219B1|2000-10-13|2004-11-16|International Business Machines Corporation|Method for biometric-based authentication in wireless communication for access control|
JP2002259345A|2001-02-27|2002-09-13|Nec Corp|Method/device for authentication for preventing unauthorized use of physical feature data, and program|
US6633090B2|2001-09-07|2003-10-14|Delphi Technologies, Inc.|Starting system for an automotive vehicle using fingerprint recognition|
JP2003148017A|2001-11-08|2003-05-21|Sharp Corp|Lock device, lock control system and method for controlling lock|
US8590013B2|2002-02-25|2013-11-19|C. S. Lee Crawford|Method of managing and communicating data pertaining to software applications for processor-based devices comprising wireless communication circuitry|
US7616784B2|2002-07-29|2009-11-10|Robert William Kocher|Method and apparatus for contactless hand recognition|
US6993165B2|2002-12-06|2006-01-31|Cross Match Technologies, Inc.|System having a rotating optical system and a non-planar prism that are used to obtain print and other hand characteristic information|
US20060133651A1|2002-12-31|2006-06-22|Polcha Andrew J|Recoverable biometric identity system and method|
DE10315923A1|2003-04-08|2004-10-28|Tbs Holding Ag|Procedure to detect data of uneven surfaces for biometric data, using non-contact optical sensing of surface|
US6923370B2|2003-05-20|2005-08-02|Bradley L. Gotfried|Access system|
US6992562B2|2003-06-10|2006-01-31|Visteon Global Technologies, Inc.|Biometric keyless entry system|
US8538095B2|2003-06-21|2013-09-17|Aprilis, Inc.|Method and apparatus for processing biometric images|
JP2005063172A|2003-08-13|2005-03-10|Toshiba Corp|Face verifying apparatus and passage controller|
US8181017B2|2004-10-22|2012-05-15|Nds Limited|Certificate renewal|
US20060120568A1|2004-12-06|2006-06-08|Mcconville Patrick J|System and method for tracking individuals|
JP2008536197A|2005-02-22|2008-09-04|コーニンクレッカフィリップスエレクトロニクスエヌヴィ|System and method for transferring media rights under predetermined conditions|
JP4696610B2|2005-03-15|2011-06-08|オムロン株式会社|Subject authentication device, face authentication device, mobile phone, and subject authentication method|
US7783980B1|2005-04-07|2010-08-24|Aol Inc.|Sharing digital items|
US20060294393A1|2005-06-24|2006-12-28|Mc Call Clark E|Remote biometric registration for vehicles|
WO2007000504A1|2005-06-27|2007-01-04|France Telecom|Biometric hand recognition method and associated system and device|
US8026840B2|2005-10-28|2011-09-27|Raytheon Company|Biometric radar system and method for identifying persons and positional states of persons|
JP4826234B2|2005-11-30|2011-11-30|オムロン株式会社|Face authentication apparatus, security strength changing method and program|
DE102006018956A1|2006-04-24|2007-10-25|Robert Bosch Gmbh|Particle`s mass or mass flow determining method for internal-combustion engine, involves arranging sensor in exhaust tract of engine, and comparing measured signal change of sensor with predicted signal change of sensor|
US7983451B2|2006-06-30|2011-07-19|Motorola Mobility, Inc.|Recognition method using hand biometrics with anti-counterfeiting|
US7660442B2|2006-09-01|2010-02-09|Handshot, Llc|Method and system for capturing fingerprints, palm prints and hand geometry|
CN101523427A|2006-09-29|2009-09-02|丹·斯卡梅尔|A system and method for verifying a user's identity in electronic transactions|
JP2008242631A|2007-03-26|2008-10-09|Oki Electric Ind Co Ltd|Iris registration apparatus and iris authentication apparatus|
US20080284726A1|2007-05-17|2008-11-20|Marc Boillot|System and Method for Sensory Based Media Control|
US8126788B2|2007-05-29|2012-02-28|Exaktime Innovations, Inc.|Method for tracking time attendance of either a dedicated user or multiple non-dedicated users, interchangeably, using a single multi-function electronic hand-held device|
CA2636304C|2007-06-27|2014-12-30|Research In Motion Limited|System and method for improving smart card reader reconnections|
JP4693818B2|2007-07-09|2011-06-01|株式会社エヌ・ティ・ティ・ドコモ|Authentication system and authentication method|
GB0714344D0|2007-07-24|2007-09-05|Univ Wales Swansea|Biometric attendance verification|
US10169646B2|2007-12-31|2019-01-01|Applied Recognition Inc.|Face authentication to mitigate spoofing|
JP5186929B2|2008-01-21|2013-04-24|日本電気株式会社|Authentication imaging device|
US9286742B2|2008-03-31|2016-03-15|Plantronics, Inc.|User authentication system and method|
US8358856B2|2008-06-02|2013-01-22|Eastman Kodak Company|Semantic event detection for digital content records|
US20100042940A1|2008-08-14|2010-02-18|Caterpillar Inc.|Geofence system with integrated user interface|
US8175379B2|2008-08-22|2012-05-08|Adobe Systems Incorporated|Automatic video image segmentation|
WO2010050206A1|2008-10-28|2010-05-06|日本電気株式会社|Spoofing detection system, spoofing detection method and spoofing detection program|
US8345932B2|2008-11-24|2013-01-01|International Business Machines Corporation|Support vector machine for biometric data processing|
JP5098973B2|2008-11-27|2012-12-12|富士通株式会社|Biometric authentication device, biometric authentication method, and biometric authentication program|
JP4636171B2|2008-12-17|2011-02-23|トヨタ自動車株式会社|Biometric authentication system for vehicles|
JP2010146502A|2008-12-22|2010-07-01|Toshiba Corp|Authentication processor and authentication processing method|
US20110270712A1|2009-01-06|2011-11-03|X-Ped Holdings Pty Ltd|Arrangement for managing mobile device access to precinct regions containing services and products and information|
US20100191551A1|2009-01-26|2010-07-29|Apple Inc.|Systems and methods for accessing hotel services using a portable electronic device|
JP5436876B2|2009-02-02|2014-03-05|Disco Corporation|Grinding method|
US20100246902A1|2009-02-26|2010-09-30|Lumidigm, Inc.|Method and apparatus to combine biometric sensing and other functionality|
US8194938B2|2009-06-02|2012-06-05|George Mason Intellectual Properties, Inc.|Face authentication using recognition-by-parts, boosting, and transduction|
US8638939B1|2009-08-20|2014-01-28|Apple Inc.|User authentication on an electronic device|
US8447119B2|2010-03-16|2013-05-21|Nec Laboratories America, Inc.|Method and system for image classification|
US8326001B2|2010-06-29|2012-12-04|Apple Inc.|Low threshold face recognition|
WO2012020591A1|2010-08-09|2012-02-16|NEC Corporation|System for identifying individuals, feature value specification device, feature specification method, and recording medium|
US8670935B2|2010-08-17|2014-03-11|Blackberry Limited|Tagging a location by pairing devices|
JP5565285B2|2010-11-19|2014-08-06|Konica Minolta, Inc.|Manufacturing method of glass optical element|
US20120137137A1|2010-11-30|2012-05-31|Brickell Ernest F|Method and apparatus for key provisioning of hardware devices|
KR101816170B1|2010-12-22|2018-01-09|Electronics and Telecommunications Research Institute|Apparatus and method for obtaining 3D depth information|
US8457370B2|2011-01-20|2013-06-04|Daon Holdings Limited|Methods and systems for authenticating users with captured palm biometric data|
US8675543B2|2011-02-24|2014-03-18|Verizon Patent And Licensing Inc.|Route limiting in border gateway protocol over satellite networks|
WO2012135861A1|2011-04-01|2012-10-04|Tony Lam|Battery powered passive keyless entry system for premise entry|
WO2012139268A1|2011-04-11|2012-10-18|Intel Corporation|Gesture recognition using depth images|
US9082235B2|2011-07-12|2015-07-14|Microsoft Technology Licensing, Llc|Using facial data for device authentication or subject identification|
US8548207B2|2011-08-15|2013-10-01|Daon Holdings Limited|Method of host-directed illumination and system for conducting host-directed illumination|
CN102426715A|2011-09-30|2012-04-25|Huawei Technologies Co., Ltd.|Unlocking method for electronic door lock, electronic door lock and electronic door lock system|
US8947202B2|2011-10-20|2015-02-03|Apple Inc.|Accessing a vehicle using portable devices|
US9111402B1|2011-10-31|2015-08-18|Replicon, Inc.|Systems and methods for capturing employee time for time and attendance management|
WO2013100898A1|2011-12-27|2013-07-04|Intel Corporation|Turing test based user authentication and user presence verification system, device, and method|
JP5866216B2|2012-01-31|2016-02-17|Tokai Rika Co., Ltd.|Electronic key registration system|
US8705070B2|2012-02-24|2014-04-22|Canon Kabushiki Kaisha|Systems and methods for managing use of an imaging device|
US9323912B2|2012-02-28|2016-04-26|Verizon Patent And Licensing Inc.|Method and system for multi-factor biometric authentication|
US20130268418A1|2012-04-04|2013-10-10|Accu-Time Systems, Inc.|Methods and apparatus for wireless communication of time and attendance information|
US20130286161A1|2012-04-25|2013-10-31|Futurewei Technologies, Inc.|Three-dimensional face recognition for mobile devices|
US9070162B2|2012-04-25|2015-06-30|ZR Investments, LLC|Time tracking device and method|
US9047376B2|2012-05-01|2015-06-02|Hulu, LLC|Augmenting video with facial recognition|
JP5780361B2|2012-05-29|2015-09-16|Murata Manufacturing Co., Ltd.|Electronic key system and electronic equipment|
US8869053B2|2012-07-06|2014-10-21|Sap Ag|Organizer for managing employee time and attendance|
US20140195974A1|2012-08-29|2014-07-10|Identity Validation Products, Llc|Method and apparatus for using a finger swipe interface to control a system|
CN104756135B|2012-09-05|2018-11-23|Element Inc.|System and method for biometric authentication in connection with camera-equipped devices|
US9740917B2|2012-09-07|2017-08-22|Stone Lock Global, Inc.|Biometric identification systems and methods|
US9002586B2|2012-12-03|2015-04-07|Honda Motor Co., Ltd.|Integrated biometric switch|
US9003196B2|2013-05-13|2015-04-07|Hoyos Labs Corp.|System and method for authorizing access to access-controlled environments|
US9014452B2|2013-08-21|2015-04-21|Seiko Epson Corporation|Orientation-aware average intensity histogram to indicate object boundary depth in ultrasound images|
KR101556599B1|2013-10-30|2015-10-02|Yonsei University Industry-Academic Cooperation Foundation|Pattern Inputting Apparatus and Method, and Recording Medium Using the Same|
US10027884B2|2014-03-05|2018-07-17|Disney Enterprises, Inc.|Method for capturing photographs and videos on a handheld client device without continually observing the device's screen|
CN106489248A|2014-05-13|2017-03-08|Element Inc.|System and method for electronic key provisioning and access management in connection with mobile devices|
US20150348214A1|2014-05-28|2015-12-03|Shailendra Jain|Messaging service for geofence-based automatic time clocking|
US20150347833A1|2014-06-03|2015-12-03|Mark Ries Robinson|Noncontact Biometrics with Small Footprint|
US9965728B2|2014-06-03|2018-05-08|Element, Inc.|Attendance authentication and management in connection with mobile devices|
US10614204B2|2014-08-28|2020-04-07|Facetec, Inc.|Facial recognition authentication system including path parameters|
US9928603B2|2014-12-31|2018-03-27|Morphotrust Usa, Llc|Detecting facial liveliness|
CN107851182B|2015-06-16|2019-04-19|EyeVerify Inc.|Systems and methods for spoof detection and liveness analysis|
KR20170027189A|2015-09-01|2017-03-09|LG Electronics Inc.|Mobile terminal and control method for the mobile terminal|
US20170186170A1|2015-12-24|2017-06-29|Thomas A. Nugraha|Facial contour recognition for identification|
US9983687B1|2017-01-06|2018-05-29|Adtile Technologies Inc.|Gesture-controlled augmented reality experience using a mobile communications device|
EP2203865A2|2007-09-24|2010-07-07|Apple Inc.|Embedded authentication systems in an electronic device|
US9002322B2|2011-09-29|2015-04-07|Apple Inc.|Authentication with secondary approver|
US8769624B2|2011-09-29|2014-07-01|Apple Inc.|Access control utilizing indirect authentication|
CN104756135B|2012-09-05|2018-11-23|Element Inc.|System and method for biometric authentication in connection with camera-equipped devices|
US20140253711A1|2013-03-07|2014-09-11|Advanced Optical Systems, Inc.|Agile non-contact biometric sensor|
US9898642B2|2013-09-09|2018-02-20|Apple Inc.|Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs|
US10678908B2|2013-09-27|2020-06-09|Mcafee, Llc|Trusted execution of an executable object on a local device|
US9400925B2|2013-11-15|2016-07-26|Facebook, Inc.|Pose-aligned networks for deep attribute modeling|
US20150278499A1|2013-11-21|2015-10-01|Yevgeny Levitov|Motion-Triggered Biometric System for Access Control|
US20150242840A1|2014-02-25|2015-08-27|Jpmorgan Chase Bank, N.A.|Systems and methods for dynamic biometric configuration compliance control|
US9510196B2|2014-03-17|2016-11-29|Qualcomm Incorporated|Method and apparatus for authenticating a user on a mobile device|
CN106489248A|2014-05-13|2017-03-08|Element Inc.|System and method for electronic key provisioning and access management in connection with mobile devices|
US9483763B2|2014-05-29|2016-11-01|Apple Inc.|User interface for payments|
US9965728B2|2014-06-03|2018-05-08|Element, Inc.|Attendance authentication and management in connection with mobile devices|
US10032011B2|2014-08-12|2018-07-24|At&T Intellectual Property I, L.P.|Method and device for managing authentication using an identity avatar|
US20160191512A1|2014-12-27|2016-06-30|Mcafee, Inc.|Predictive user authentication|
EP3259688A4|2015-02-19|2018-12-12|Digital Reasoning Systems, Inc.|Systems and methods for neural language modeling|
KR20160148863A|2015-06-17|2016-12-27|Electronics and Telecommunications Research Institute|Apparatus for user verification|
CN105654092B|2015-11-25|2019-08-30|Xiaomi Inc.|Feature extraction method and device|
CN105654093B|2015-11-25|2018-09-25|Xiaomi Inc.|Feature extraction method and device|
WO2017093294A1|2015-11-30|2017-06-08|Koninklijke Philips N.V.|Pulse oximetry and contactless patient biometric monitoring system|
US10397208B2|2015-12-11|2019-08-27|Paypal, Inc.|Authentication via item recognition|
CN105741375B|2016-01-20|2018-09-18|Central China Normal University|Infrared-image attendance method based on large-field-of-view binocular vision|
US9858340B1|2016-04-11|2018-01-02|Digital Reasoning Systems, Inc.|Systems and methods for queryable graph representations of videos|
DK179186B1|2016-05-19|2018-01-15|Apple Inc|REMOTE AUTHORIZATION TO CONTINUE WITH AN ACTION|
DK201670622A1|2016-06-12|2018-02-12|Apple Inc|User interfaces for transactions|
US10303865B2|2016-08-31|2019-05-28|Redrock Biometrics, Inc.|Blue/violet light touchless palm print identification|
US9842330B1|2016-09-06|2017-12-12|Apple Inc.|User interfaces for stored-value accounts|
US10496808B2|2016-10-25|2019-12-03|Apple Inc.|User interface for managing access to credentials for use in an operation|
US10303961B1|2017-04-13|2019-05-28|Zoox, Inc.|Object detection and passenger notification|
US10643051B2|2017-07-13|2020-05-05|Samsung Electronics Co., Ltd.|Optics-based fingerprint sensor, electric device including optics-based fingerprint sensor, and operation method of electric device|
KR20200136504A|2017-09-09|2020-12-07|Apple Inc.|Implementation of biometric authentication|
KR102301599B1|2017-09-09|2021-09-10|Apple Inc.|Implementation of biometric authentication|
BR112020005325A2|2017-09-18|2020-09-24|Element, Inc.|Methods, systems, and media for detecting spoofing in mobile authentication|
CN108307113B|2018-01-26|2020-10-09|Beijing Tusen Zhitu Technology Co., Ltd.|Image acquisition method, image acquisition control method and related device|
US11170085B2|2018-06-03|2021-11-09|Apple Inc.|Implementation of biometric authentication|
US10873577B2|2018-08-17|2020-12-22|Evgeny Chereshnev|Identifying and authorizing user data over a network based on biometric and statistical data|
US11100349B2|2018-09-28|2021-08-24|Apple Inc.|Audio assisted enrollment|
US10860096B2|2018-09-28|2020-12-08|Apple Inc.|Device control using gaze information|
EP3702958A1|2019-02-26|2020-09-02|Identy Inc.|Method for verifying the identity of a user by identifying an object within an image that has a biometric characteristic of the user and separating a portion of the image comprising the biometric characteristic from other portions of the image|
US11200313B2|2019-03-18|2021-12-14|Visa International Service Association|Defense mechanism against component wise hill climbing using synthetic face generators|
US20200302517A1|2019-03-24|2020-09-24|Apple Inc.|User interfaces for managing an account|
US10915725B2|2019-07-01|2021-02-09|Thales Dis Usa Inc.|Method to generate a slap/fingers foreground mask|
Legal status:
2018-11-21| B06F| Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]|
2020-02-04| B06U| Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]|
2021-10-19| B350| Update of information on the portal [chapter 15.35 patent gazette]|
2022-01-18| B06A| Patent application procedure suspended [chapter 6.1 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201261696820P| true| 2012-09-05|2012-09-05|
PCT/US2013/058343|WO2014039732A2|2012-09-05|2013-09-05|System and method for biometric authentication in connection with camera-equipped devices|